Cascade Correlation: Derivation of a More Numerically Stable Update Rule

Author

  • George H. John
Abstract

We discuss the weight update rule in the Cascade Correlation neural network learning algorithm. The weight update rule implements gradient ascent on the correlation between a new hidden unit's output and the previous network's error. We present a derivation of the gradient of the correlation function and show that the resulting weight update rule trains slightly faster. We also show that the new rule is mathematically equivalent to the one presented in the original Cascade Correlation paper, and we discuss the numerical issues underlying the difference in performance. Since a derivation of the Cascade Correlation weight update rule has not been published, this paper should be useful to those who wish to understand the rule.
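
For readers who want the quantity in question: in the original Cascade-Correlation paper (Fahlman and Lebiere, 1990), candidate training maximizes the correlation-style measure below, and the published update rule is its gradient. This is background context, not John's rearranged form; notation follows Fahlman and Lebiere, with V_p the candidate unit's output on training pattern p, E_{p,o} the residual error at output o, f'_p the derivative of the candidate's activation function, and I_{i,p} the value of input i on pattern p.

```latex
S = \sum_{o}\Bigl|\sum_{p}\bigl(V_p - \overline{V}\bigr)\bigl(E_{p,o} - \overline{E}_o\bigr)\Bigr|,
\qquad
\frac{\partial S}{\partial w_i} = \sum_{p,o}\sigma_o\,\bigl(E_{p,o} - \overline{E}_o\bigr)\,f'_p\,I_{i,p},
```

where \overline{V} and \overline{E}_o are averages over all patterns and \sigma_o is the sign of the inner correlation for output o.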


Similar articles

Derivation of Specific Heat Rejection Correlation in an SI Engine; Experimental and Numerical Study

Thermal balance analysis is a useful method for determining the energy distribution and efficiency of internal combustion (IC) engines. In engine cooling design, estimation of the heat transfer to brake power ratio, one of the most significant performance characteristics, is in high demand. In this paper, an investigation of the energy balance and a derivation of the specific heat rejection are carried out e...
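
As a rough illustration of the thermal balance this abstract refers to (generic first-law bookkeeping; the symbols below are ours, not the paper's):

```latex
\dot{Q}_{\mathrm{fuel}} = P_b + \dot{Q}_{\mathrm{cool}} + \dot{Q}_{\mathrm{exh}} + \dot{Q}_{\mathrm{misc}},
\qquad
r = \frac{\dot{Q}_{\mathrm{cool}}}{P_b},
```

where \dot{Q}_{\mathrm{fuel}} is the fuel energy input rate, P_b the brake power, \dot{Q}_{\mathrm{cool}} and \dot{Q}_{\mathrm{exh}} the heat rejected to coolant and exhaust, and r the heat transfer to brake power ratio discussed above.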

A numerically stable fast Newton type adaptive filter based on order update fast least squares algorithm

The numerical properties of an adaptive filter algorithm are the most important concern in practical applications. Most fast adaptive filter algorithms suffer from numerical instability, and the fast Newton transversal filter (FNTF) algorithms are no exception. In this paper, we propose a numerically stable fast Newton type adaptive filter algorithm. Two problems are dealt with in the paper. F...
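
The FNTF details are truncated above, so as a minimal stand-in, here is a sketch of a simpler adaptive filter, the normalized LMS (NLMS) update, whose per-sample normalization is a classic way to keep the update numerically well behaved. All names and parameters here are illustrative, not from the paper.

```python
import numpy as np

def nlms(x, d, order=3, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive filter (illustrative; not the paper's FNTF)."""
    w = np.zeros(order)                      # filter taps
    e = np.zeros(len(x))                     # a-priori error per sample
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]     # newest sample first
        e[n] = d[n] - w @ u                  # prediction error
        w += mu * e[n] * u / (eps + u @ u)   # normalization bounds the step size
    return w, e

# Identify an unknown 3-tap FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.array([0.6, -0.3, 0.1])               # "unknown" system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, _ = nlms(x, d)
print(np.round(w, 3))                        # converges toward [0.6, -0.3, 0.1]
```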

Neural Network Architectures and Learning

Various learning methods for neural networks, including supervised and unsupervised methods, are presented and illustrated with examples. A general learning rule expressed as a function of the incoming signals is discussed. Other learning rules, such as Hebbian learning, perceptron learning, LMS (Least Mean Square) learning, delta learning, WTA (Winner-Take-All) learning, and PCA (Principal Component Analy...
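
As a concrete sample of one of the rules named above, here is a minimal sketch of the delta (LMS) rule for a single linear unit; the data and learning rate are made up for illustration.

```python
import numpy as np

# Delta / LMS rule for one linear unit: w <- w + eta * (target - output) * input.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))    # 200 training patterns, 3 inputs
true_w = np.array([1.5, -2.0, 0.5])
t = X @ true_w                       # teacher signal from a known linear map

w = np.zeros(3)
eta = 0.01                           # learning rate
for epoch in range(50):
    for x, target in zip(X, t):
        y = w @ x                    # unit's output
        w += eta * (target - y) * x  # error-driven weight change
print(np.round(w, 3))                # converges toward [1.5, -2.0, 0.5]
```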

Cascade of Fractional Differential Equations and Generalized Mittag-Leffler Stability

This paper addresses a new vision of the generalized Mittag-Leffler stability of fractional differential equations. We mainly focus on a new method consisting of decomposing a given fractional differential equation into a cascade of many sub-fractional differential equations, and we propose a procedure for analyzing the generalized Mittag-Leffler stability of the given fractional different...
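
For reference, the Mittag-Leffler function behind the stability notion in the title is the one-parameter series below, which generalizes the exponential (E_1(z) = e^z); stating it here is background, not material from the paper.

```latex
E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)}, \qquad \alpha > 0.
```

In the commonly used definition, a system is Mittag-Leffler stable when its solutions are bounded by a decaying expression of the form \bigl[m(x(t_0))\,E_{\alpha}(-\lambda (t - t_0)^{\alpha})\bigr]^{b} with \lambda, b > 0.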

Publication date: 2007